Direct Sum
Definition: Let $V$ be a vector space and let $M_1, M_2, \ldots, M_k$ be subspaces of $V$. The sum of the subspaces, denoted $M$, is defined as

$$M = \{m = m_1 + m_2 + \ldots + m_k \mid m_i \in M_i,\ i = 1, 2, \ldots, k\}.$$
Theorem: The sum of subspaces $M$ is a subspace of $V$.
Proof: Let $x, y \in M$ and let $\alpha, \beta$ be scalars. Then

$$x = m_1 + m_2 + \ldots + m_k, \qquad y = \bar{m}_1 + \bar{m}_2 + \ldots + \bar{m}_k$$

with $m_i, \bar{m}_i \in M_i$. Hence

$$\alpha x + \beta y = \alpha(m_1 + \ldots + m_k) + \beta(\bar{m}_1 + \ldots + \bar{m}_k) = (\alpha m_1 + \beta \bar{m}_1) + (\alpha m_2 + \beta \bar{m}_2) + \ldots + (\alpha m_k + \beta \bar{m}_k).$$

Since each $M_i$ is a subspace, $\alpha m_i + \beta \bar{m}_i \in M_i$ for $i = 1, 2, \ldots, k$, and therefore $\alpha x + \beta y \in M$. $\blacksquare$
Remark: Let $V = M_1 + M_2 + \ldots + M_k$. The subspaces $M_1, M_2, \ldots, M_k$ are called linearly independent if

$$m_1 + m_2 + \ldots + m_k = 0 \implies m_1 = m_2 = \ldots = m_k = 0,$$

where $m_i \in M_i$ for $i = 1, 2, \ldots, k$.
Definition: Let $M_1, M_2, \ldots, M_k$ be subspaces of $V$ such that

i) $M = M_1 + M_2 + \ldots + M_k$,

ii) $M_1, M_2, \ldots, M_k$ are linearly independent.

Then $M$ is called the direct sum of $M_1, M_2, \ldots, M_k$, and we write $M = M_1 \oplus M_2 \oplus \ldots \oplus M_k$.
For a direct sum, the dimensions add: $\dim M = \dim M_1 + \dim M_2 + \ldots + \dim M_k$.
Example: Let $V = \mathbb{R}^4$ with $x = \begin{bmatrix} x_1 \\ x_2 \\ x_3 \\ x_4 \end{bmatrix} \in \mathbb{R}^4$, and consider

$$M_1 = \{x \in \mathbb{R}^4 \mid x_3 = x_4 = 0\}, \qquad \dim M_1 = 2,$$

$$M_2 = \{x \in \mathbb{R}^4 \mid x_1 = x_2 = 0\}, \qquad \dim M_2 = 2,$$

$$M_3 = \{x \in \mathbb{R}^4 \mid x_1 = 0\}, \qquad \dim M_3 = 3.$$

Then $M_1 + M_2 = \mathbb{R}^4$, and since $M_1$ and $M_2$ are linearly independent, the sum is in fact direct: $M_1 \oplus M_2 = \mathbb{R}^4$.
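In this example every $x \in \mathbb{R}^4$ splits into its $M_1$ and $M_2$ components by simply zeroing the complementary coordinates. A minimal sketch in plain Python (the helper name `split_direct_sum` is ours, not from the notes):

```python
def split_direct_sum(x):
    """Decompose x in R^4 as m1 + m2, with m1 in M1 (x3 = x4 = 0)
    and m2 in M2 (x1 = x2 = 0)."""
    m1 = [x[0], x[1], 0, 0]   # component in M1
    m2 = [0, 0, x[2], x[3]]   # component in M2
    return m1, m2

x = [3, -1, 4, 2]
m1, m2 = split_direct_sum(x)

# The components lie in M1 and M2 respectively and sum back to x;
# because the sum is direct, this decomposition is the only one.
assert [a + b for a, b in zip(m1, m2)] == x
print(m1, m2)  # → [3, -1, 0, 0] [0, 0, 4, 2]
```

Note that $M_1 + M_3$ also equals $\mathbb{R}^4$, but that sum is not direct, so such a split would not be unique there.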
Definition: If $M = V$, then $M = M_1 \oplus M_2 \oplus \ldots \oplus M_k$ is called a direct sum decomposition of $V$.
Remark: Let $V = M_1 \oplus M_2 \oplus \ldots \oplus M_k$ be a direct sum decomposition of $V$. Then every $x \in V$ has a unique representation $x = m_1 + m_2 + \ldots + m_k$ with $m_i \in M_i$.

Proof: Suppose

$$x = m_1 + m_2 + \ldots + m_k = \bar{m}_1 + \bar{m}_2 + \ldots + \bar{m}_k$$

are two such representations, with $m_i, \bar{m}_i \in M_i$. Subtracting,

$$(m_1 - \bar{m}_1) + (m_2 - \bar{m}_2) + \ldots + (m_k - \bar{m}_k) = 0,$$

where $m_i - \bar{m}_i \in M_i$ for each $i$. Since $M_1, M_2, \ldots, M_k$ are linearly independent, each term must vanish:

$$m_1 = \bar{m}_1, \quad m_2 = \bar{m}_2, \quad \ldots, \quad m_k = \bar{m}_k. \qquad \blacksquare$$
Definition: Let $V$ be an inner product space and let $M_1$ and $M_2$ be subspaces of $V$. $M_1$ and $M_2$ are called orthogonal, written $M_1 \perp M_2$, if

$$\langle m_1, m_2 \rangle = 0 \quad \forall\, m_1 \in M_1 \text{ and } m_2 \in M_2.$$
Definition: Let $M = M_1 \oplus M_2 \oplus \ldots \oplus M_k$ with $M_1 \perp M_2 \perp \ldots \perp M_k$ (pairwise orthogonal). Then $M$ is called the orthogonal direct sum of $M_1, M_2, \ldots, M_k$, and we write $M = M_1 \overset{\perp}{\oplus} M_2 \overset{\perp}{\oplus} \ldots \overset{\perp}{\oplus} M_k$.
Definition: Let $M = M_1 \oplus M_2 \oplus \ldots \oplus M_k$ be a direct sum decomposition of $V$. The orthogonal complement of $M_i$ is defined as

$$M_i^{\perp} = \{x \in V \mid \langle x, m_i \rangle = 0 \ \forall\, m_i \in M_i\}.$$
Theorem: $M_i^{\perp}$ is a subspace of $V$.
Proof: We must show that if $x, y \in M_i^{\perp}$, then $\alpha x + \beta y \in M_i^{\perp}$.

First, $M_i^{\perp}$ is nonempty: $0 \in M_i^{\perp}$ since $\langle 0, m_i \rangle = 0$ for all $m_i \in M_i$.

Let $x, y \in M_i^{\perp}$ and let $\alpha, \beta \in \mathbb{R}$. Then

$$\langle \alpha x + \beta y, m_i \rangle = \alpha \langle x, m_i \rangle + \beta \langle y, m_i \rangle = 0$$

for all $m_i \in M_i$. Therefore $\alpha x + \beta y \in M_i^{\perp}$. $\blacksquare$
Example: Let $V = \mathbb{R}^3$ and $M = \operatorname{span}\left\{\begin{bmatrix} 0 \\ -1 \\ 1 \end{bmatrix}, \begin{bmatrix} -1 \\ 0 \\ 1 \end{bmatrix}\right\}$. Find $M^{\perp}$.

Solution: Let $x = \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} \in M^{\perp}$. Then $x$ must satisfy $\langle x, m_1 \rangle = 0$ and $\langle x, m_2 \rangle = 0$ for the two spanning vectors:

$$\left\langle x, \begin{bmatrix} 0 \\ -1 \\ 1 \end{bmatrix} \right\rangle = -x_2 + x_3 = 0 \implies x_2 = x_3,$$

$$\left\langle x, \begin{bmatrix} -1 \\ 0 \\ 1 \end{bmatrix} \right\rangle = -x_1 + x_3 = 0 \implies x_1 = x_3.$$

Hence

$$x = \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} x_3 \\ x_3 \\ x_3 \end{bmatrix} = x_3 \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}, \qquad M^{\perp} = \operatorname{span}\left\{\begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}\right\}.$$
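The result is easy to check numerically: the candidate complement vector must have zero inner product with each spanning vector of $M$. A quick sketch in plain Python:

```python
def dot(u, v):
    """Standard inner product on R^3."""
    return sum(a * b for a, b in zip(u, v))

m1 = [0, -1, 1]
m2 = [-1, 0, 1]
n = [1, 1, 1]  # candidate basis vector for M⊥

# n is orthogonal to both spanning vectors, so span{n} ⊆ M⊥;
# since dim M = 2 in R^3, dimension counting gives equality.
assert dot(n, m1) == 0
assert dot(n, m2) == 0
print("inner products:", dot(n, m1), dot(n, m2))  # → 0 0
```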
Theorem: Let $V$ be an inner product space and let $M$ be a subspace of $V$. Then $M \oplus M^{\perp} = V$; that is, $V$ can always be written as the direct sum of a subspace and its orthogonal complement.
Proof: We need to show two things:

1. $M$ and $M^{\perp}$ are linearly independent.
2. Any $x \in V$ can be written as $x = m + m^{\perp}$ where $m \in M$ and $m^{\perp} \in M^{\perp}$.

See lecture notes for the full proof.
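For the $\mathbb{R}^3$ example above, the decomposition $x = m + m^{\perp}$ can be computed by orthogonal projection: $m^{\perp}$ is the projection of $x$ onto $\operatorname{span}\{[1\ 1\ 1]^T\} = M^{\perp}$, and $m = x - m^{\perp}$. A minimal sketch in plain Python (the helper name `project_onto` is ours):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def project_onto(x, n):
    """Orthogonal projection of x onto span{n} (n nonzero)."""
    c = dot(x, n) / dot(n, n)
    return [c * a for a in n]

n = [1, 1, 1]          # basis of M⊥ from the example above
x = [3.0, 0.0, 0.0]

m_perp = project_onto(x, n)              # component in M⊥
m = [a - b for a, b in zip(x, m_perp)]   # component in M

# m + m_perp recovers x, and ⟨m, n⟩ = 0, i.e. m lies in M.
assert all(abs(a + b - c) < 1e-12 for a, b, c in zip(m, m_perp, x))
assert abs(dot(m, n)) < 1e-12
print(m, m_perp)  # → [2.0, -1.0, -1.0] [1.0, 1.0, 1.0]
```

Since $M \perp M^{\perp}$ here, this is in fact an orthogonal direct sum decomposition of $\mathbb{R}^3$.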
#EE501 - Linear Systems Theory at METU